Neural network music composition by prediction: Exploring the benefits of
psychophysical constraints and multiscale processing
In algorithmic music composition, a simple technique involves selecting notes
sequentially according to a transition table that specifies the probability of
the next note as a function of the previous context. I describe an extension
of this transition table approach using a recurrent autopredictive
connectionist network called CONCERT. CONCERT is trained on a set of pieces
with the aim of extracting stylistic regularities. CONCERT can then be used
to compose new pieces. A central ingredient of CONCERT is the incorporation of
psychologically grounded representations of pitch, duration, and harmonic
structure. CONCERT was tested on sets of examples artificially generated
according to simple rules and was shown to learn the underlying structure,
even where other approaches failed. In larger experiments, CONCERT was trained
on sets of J. S. Bach pieces and traditional European folk melodies
and was then allowed to compose novel melodies. Although the compositions
are occasionally pleasant and are preferred over those generated by a
third-order transition table, they suffer from a lack of
global coherence. To overcome this limitation, several methods are explored
to permit CONCERT to induce structure at both fine and coarse scales. In
experiments with a training set of waltzes, these methods yielded limited
success, but the overall results cast doubt on the promise of note-by-note
prediction for composition.
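
To make the transition-table baseline mentioned in the abstract concrete, here is a minimal sketch (not code from the paper): a table estimated from a corpus gives the probability of the next note as a function of the previous context, and a melody is composed by sampling notes one at a time. The ORDER constant, the toy corpus, and the function names are illustrative assumptions; the paper's baseline comparison uses a third-order table.

```python
# Minimal sketch of composition by sampling from a transition table.
# ORDER, the toy corpus, and all names here are illustrative, not from the paper.
import random
from collections import defaultdict, Counter

ORDER = 3  # context length; a third-order table conditions on the previous three notes

def build_table(melodies):
    """Count how often each note follows each length-ORDER context."""
    table = defaultdict(Counter)
    for melody in melodies:
        for i in range(len(melody) - ORDER):
            context = tuple(melody[i:i + ORDER])
            table[context][melody[i + ORDER]] += 1
    return table

def compose(table, seed, length):
    """Select notes sequentially according to the transition probabilities."""
    melody = list(seed)
    for _ in range(length):
        context = tuple(melody[-ORDER:])
        counts = table.get(context)
        if not counts:  # unseen context: stop here (a real system would back off)
            break
        notes, weights = zip(*counts.items())
        melody.append(random.choices(notes, weights=weights)[0])
    return melody

# Toy usage with pitches as MIDI note numbers (illustrative data only)
corpus = [[60, 62, 64, 65, 67, 65, 64, 62, 60],
          [60, 64, 67, 64, 60, 62, 64, 62, 60]]
table = build_table(corpus)
print(compose(table, seed=corpus[0][:ORDER], length=12))
```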
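The abstract's extension of this idea replaces the fixed-order table with a recurrent network trained to predict the next note from the notes so far, then sampled from to compose. The sketch below shows a generic recurrent next-note predictor of that kind; it is not CONCERT's actual architecture, omits the psychologically grounded pitch, duration, and harmony representations the paper describes, and its class, dimensions, and the training loop (standard cross-entropy on next-note targets) are assumptions.

```python
# Hedged sketch of next-note prediction with a generic recurrent network.
# Not the CONCERT architecture; names and hyperparameters are illustrative.
import torch
import torch.nn as nn

class NextNoteRNN(nn.Module):
    def __init__(self, n_notes, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(n_notes, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, n_notes)

    def forward(self, notes, hidden=None):
        # notes: (batch, time) integer note indices
        x = self.embed(notes)
        h, hidden = self.rnn(x, hidden)
        return self.out(h), hidden  # logits over the next note at each step

def compose(model, seed, length):
    """Autoregressive composition: feed each sampled note back as input."""
    model.eval()
    notes = list(seed)
    hidden = None
    with torch.no_grad():
        inp = torch.tensor([notes])
        for _ in range(length):
            logits, hidden = model(inp, hidden)
            probs = torch.softmax(logits[0, -1], dim=-1)
            nxt = torch.multinomial(probs, 1).item()
            notes.append(nxt)
            inp = torch.tensor([[nxt]])
    return notes
```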
PDF of paper
Compositions based on Bach:
(mp3) (wma)
Compositions based on English folk tunes:
(mp3) (wma)
Compositions based on waltzes:
(mp3) (wma)